6.1 Structure and Quantity


Fig. 6.1 The procedures involved in carrying out an experiment, from conception to ultimate dissemination

activities. The posterior part ($I$) is sometimes called "missing information" because once the prior part ($K$) is specified, the system still has the freedom, quantified by $I$, to adopt different microstates. In a musical analogy, $K$ would correspond to the structure of a Bach fugue and $I$ to the freedom the performer has in making interpretational choices while still respecting the structure.⁹ One could say that the magnitude of $I$ corresponds to the degree of logical indeterminacy inhering in the system, in other words that part of its description that cannot be formulated within itself; it is the amount of selective information lacking.
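As a concrete reading of this (a minimal illustration; the count $W$ is assumed for the example, not taken from the text): if the structural constraints $K$ leave $W$ equally probable microstates accessible, the missing information is given by the Hartley index,

$$I = \log_2 W \,, \qquad \text{e.g. } W = 8 \;\Rightarrow\; I = \log_2 8 = 3 \text{ bits}.$$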

$I$ can often be calculated according to the procedures described in the previous section (the Hartley or Shannon index). If we need to quantify $K$, it can be done using the concept of algorithmic information content (AIC) or Kolmogorov information, which corresponds to the length of the most concise description of what is known about the system (see Sect. 11.5). Hence, the total information $\mathfrak{I}$¹⁰ is the sum of the ensemble (Shannon) entropy $I$ and the physical (Kolmogorov) entropy $K$:

$$\mathfrak{I} = I + K \,. \qquad (6.13)$$
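A minimal computational sketch of Eq. (6.13) is given below. The Shannon entropy of an observed record stands in for $I$; since the true Kolmogorov information is uncomputable, the length of a compressed description is used as a standard upper-bound proxy for $K$. The helper names, the toy system description, and the compression proxy are all illustrative assumptions, not procedures from the text.

```python
import math
import zlib
from collections import Counter

def shannon_bits(symbols):
    """Ensemble (Shannon) entropy I of an observed symbol sequence, in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def kolmogorov_upper_bound_bits(description: bytes):
    """Upper bound on the algorithmic information content K of a system description,
    taken as the length of its zlib-compressed form. True K is uncomputable;
    any real compressor only ever gives an upper bound."""
    return 8 * len(zlib.compress(description, level=9))

# Toy system: a structural description (prior, K) and observed microstates (posterior, I).
structure = b"two-state system; states labelled 0 and 1; equal a priori weights"
observations = "0110100110010110"  # hypothetical record of observed microstates

I = shannon_bits(observations)              # ensemble (Shannon) entropy
K = kolmogorov_upper_bound_bits(structure)  # bound on the structural information
total = I + K                               # Eq. (6.13): total information
print(f"I = {I:.2f} bits/symbol, K <= {K} bits, total = {total:.2f} bits")
```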

Mackay (1950) proposed the terms "logon" for the structural (prior) information, equivalent to $K$ in Eq. (6.13), and "metron" for the metrical (posterior) measurement.

The gain in information from a measurement (Eq. 6.7) falls wholly within the metrical domain, of course, and within that domain there are prior and posterior components (cf. Sect. 9.4).

To summarize, the Kolmogorov information $K$ can be used to define the structure of information and is calculated by considering the system used to make a measurement. The result of the measurement is macroscopic, remembered information, quantified by the Shannon index $I$. The gain in information equals [final (denoted by subscript f) minus initial (denoted by subscript i) information]:

$$I = (I_\mathrm{f} + K) - (I_\mathrm{i} + K) = I_\mathrm{f} - I_\mathrm{i} \,. \qquad (6.14)$$
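A short numerical check of Eq. (6.14), reusing the hypothetical shannon_bits and kolmogorov_upper_bound_bits helpers and the structure description from the sketch above; the before/after records are invented toy data:

```python
# Gain in information per Eq. (6.14): the structural term K cancels,
# so only the change in remembered (Shannon) information survives.
before = "0000000000000000"   # prior record: no variety, I_i = 0 bits
after  = "0110100110010110"   # post-measurement record: I_f = 1 bit

I_i = shannon_bits(before)
I_f = shannon_bits(after)
K_bound = kolmogorov_upper_bound_bits(structure)  # assumed unchanged by the measurement

gain = (I_f + K_bound) - (I_i + K_bound)          # = I_f - I_i
print(f"gain = {gain:.2f} bits (K cancels)")
```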

In other words, it is unexceptionable to assume that the measurement procedure does not change the structural information, although this must only be regarded as a cautious, provisional statement.¹¹ Presumably, any measurement or series of mea-

⁹ Cf. Tureck (1995).

¹⁰ Called the physical information of a system by Zurek (1989).

¹¹ $K$ is in turn embedded within higher systems such as language, mathematics and general engineering knowledge, embodying, too, much tacit knowledge.